Truncated-Newton Training Algorithm for Neurocomputational Viscoplastic Model

Authors

  • M. S. Al-Haik
  • H. Garmestani
Abstract

We present an estimation approach to compute the viscoplastic behavior of a polymer matrix composite (PMC) under different thermomechanical environments. This investigation incorporates a computational neural network as the tool for determining the creep behavior of the composite. We propose a new second-order learning algorithm for training the multilayer networks. Training in the neural network is generally specified as the minimization of an appropriate error function with respect to the parameters of the network (weights and learning rates) corresponding to excitatory and inhibitory connections. We propose here a technique for error minimization based on the use of the truncated Newton (TN) large-scale unconstrained minimization technique with a quadratic convergence rate. This technique makes more sophisticated use of the gradient information than simple steepest descent or conjugate gradient methods. In this work we briefly specify the necessary details for implementing the truncated Newton method for training the neural network that predicts the viscoplastic behavior of the polymeric composite. We provide comparative experimental results and explicit model results that verify the effectiveness of the neural-network-based model.

1 Corresponding author. Tel.: 850-644-6560. E-mail address: [email protected].
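The training scheme described in the abstract — minimizing the network error with a truncated Newton method, where each Newton step is obtained from a truncated inner conjugate-gradient solve that uses only Hessian-vector products — can be sketched as below. The toy network, synthetic data, and parameter names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Minimal sketch of truncated-Newton (Newton-CG) training on a tiny
# single-layer tanh network with squared error. Illustrative only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                  # 50 samples, 3 inputs
y = np.tanh(X @ np.array([0.5, -1.0, 2.0]))   # synthetic targets

def loss(w):
    return 0.5 * np.mean((np.tanh(X @ w) - y) ** 2)

def grad(w):
    p = np.tanh(X @ w)
    return X.T @ ((p - y) * (1 - p ** 2)) / len(y)

def hess_vec(w, v, eps=1e-6):
    # Hessian-vector product by finite differences of the gradient:
    # the full Hessian is never formed, which is what makes TN scale.
    return (grad(w + eps * v) - grad(w)) / eps

def truncated_newton(w, outer=20, inner=10, tol=1e-8):
    for _ in range(outer):
        g = grad(w)
        if np.linalg.norm(g) < tol:
            break
        # Inner loop: approximately solve H d = -g with conjugate
        # gradients, truncated after `inner` iterations.
        d = np.zeros_like(w)
        r = -g.copy()
        p = r.copy()
        for _ in range(inner):
            Hp = hess_vec(w, p)
            curv = p @ Hp
            if curv <= 0:          # negative curvature: keep current d
                break
            alpha = (r @ r) / curv
            d += alpha * p
            r_new = r - alpha * Hp
            if np.linalg.norm(r_new) < tol:
                break
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        if not d.any():
            d = -g                 # fall back to steepest descent
        # Simple backtracking line search along the Newton direction.
        t, f0 = 1.0, loss(w)
        while loss(w + t * d) > f0 and t > 1e-8:
            t *= 0.5
        if loss(w + t * d) < f0:
            w = w + t * d
        else:
            break                  # no further improvement possible
    return w

w = truncated_newton(np.zeros(3))
print(loss(w))  # loss after training, well below the initial value
```

The inner CG loop is where the "truncation" happens: stopping after a few iterations yields an inexact Newton direction at a fraction of the cost of a full Hessian solve.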


Similar articles

Truncated-Newton Training Algorithm for Neurocomputational Viscoplastic Model

We present an estimation approach to compute the viscoplastic behavior of a polymeric composite under different thermomechanical environments. This investigation incorporates a computational neural network as the tool for determining the creep behavior of the composite. We propose a new second-order learning algorithm for training the multilayer networks. Training in the neural network is generally ...


Large Scale Empirical Risk Minimization via Truncated Adaptive Newton Method

We consider large scale empirical risk minimization (ERM) problems, where both the problem dimension and variable size are large. In these cases, most second order methods are infeasible due to the high cost in both computing the Hessian over all samples and computing its inverse in high dimensions. In this paper, we propose a novel adaptive sample size second-order method, which reduces the cos...


Preconditioned Conjugate Gradient Methods in Truncated Newton Frameworks for Large-scale Linear Classification

The truncated Newton method is one of the most effective optimization methods for large-scale linear classification. The main computational task at each Newton iteration is to approximately solve a quadratic sub-problem by an iterative procedure such as the conjugate gradient (CG) method. It is known that CG has slow convergence if the sub-problem is ill-conditioned. Preconditioned CG (PCG) methods...
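The preconditioned inner solve described above can be sketched as follows, here with a simple diagonal (Jacobi) preconditioner on an explicit Hessian; the matrix, names, and parameters are illustrative assumptions rather than the paper's algorithm:

```python
import numpy as np

def pcg(H, g, max_iter=100, tol=1e-10):
    """Approximately solve the Newton sub-problem H d = -g with
    Jacobi-preconditioned conjugate gradients (illustrative sketch)."""
    M_inv = 1.0 / np.diag(H)    # inverse of the diagonal preconditioner
    d = np.zeros_like(g)
    r = -g.copy()               # residual b - H d, with b = -g and d = 0
    z = M_inv * r               # preconditioned residual
    p = z.copy()
    for _ in range(max_iter):
        Hp = H @ p
        alpha = (r @ z) / (p @ Hp)
        d += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv * r_new
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return d

# An ill-conditioned SPD test Hessian: plain CG would need many
# iterations, but the Jacobi preconditioner removes the scaling.
H = np.diag([1.0, 100.0, 10000.0])
g = np.array([1.0, 1.0, 1.0])
d = pcg(H, g)
print(np.allclose(H @ d, -g))  # True
```

On this diagonal example the preconditioned system is perfectly conditioned, so PCG converges in a single iteration; in practice the preconditioner only approximates the Hessian's diagonal or block structure.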


Training Large Neural Networks

We describe regularization tools for training large-scale artificial feed-forward neural networks. We propose algorithms that explicitly use a sequence of Tikhonov regularized nonlinear least squares problems. For large-scale problems, methods using new special purpose automatic differentiation are used in a conjugate gradient method for computing a truncated Gauss-Newton search direction. The al...


Training Deep and Recurrent Networks with Hessian-Free Optimization

Hessian-Free optimization (HF) is an approach for unconstrained minimization of real-valued smooth objective functions. Like standard Newton’s method, it uses local quadratic approximations to generate update proposals. It belongs to the broad class of approximate Newton methods that are practical for problems of very high dimensionality, such as the training objectives of large neural networks...





Publication year: 2003